448 research outputs found
FiBiNET: Combining Feature Importance and Bilinear feature Interaction for Click-Through Rate Prediction
Advertising and feed ranking are essential to many Internet companies such as
Facebook and Sina Weibo. In many real-world advertising and feed ranking
systems, click-through rate (CTR) prediction plays a central role. Many models
have been proposed in this field, such as logistic regression, tree-based
models, factorization machine based models, and deep learning based CTR models.
However, many current works compute feature interactions in a simple way, such
as the Hadamard product or inner product, and pay little attention to the
importance of features. In this paper, a new model named FiBiNET, an
abbreviation for Feature Importance and Bilinear feature Interaction NETwork,
is proposed to dynamically learn feature importance and fine-grained feature
interactions. On the one hand, FiBiNET can dynamically learn the importance of
features via the Squeeze-and-Excitation network (SENET) mechanism; on the other
hand, it can effectively learn feature interactions via a bilinear function. We
conduct extensive experiments on two real-world datasets and show that our
shallow model outperforms other shallow models such as the factorization
machine (FM) and the field-aware factorization machine (FFM). To improve
performance further, we combine a classical deep neural network (DNN) component
with the shallow model to form a deep model. The deep FiBiNET consistently
outperforms other state-of-the-art deep models such as DeepFM and the extreme
deep factorization machine (xDeepFM).
Comment: 8 pages, 5 figures
MemoNet: Memorizing Representations of All Cross Features Efficiently via Multi-Hash Codebook Network for CTR Prediction
New findings in natural language processing (NLP) demonstrate that strong
memorization capability contributes substantially to the success of large
language models. This inspires us to explicitly bring an independent memory
mechanism into the CTR ranking model to learn and memorize the representations
of all cross features. In this paper, we propose the multi-Hash Codebook
NETwork (HCNet) as a memory mechanism for efficiently learning and memorizing
representations of all cross features in CTR tasks. HCNet uses a multi-hash
codebook as its main memory, and the whole memory procedure consists of three
phases: multi-hash addressing, memory restoring, and feature shrinking. HCNet
can be regarded as a general module and can be incorporated into any current
deep CTR model. We also propose a new CTR model named MemoNet, which combines
HCNet with a DNN backbone. Extensive experimental results on three public
datasets show that MemoNet achieves superior performance over state-of-the-art
approaches and validate the effectiveness of HCNet as a strong memory module.
Besides, MemoNet exhibits the scaling behavior of big models in NLP: enlarging
the codebook in HCNet sustainably yields performance gains. Our work
demonstrates the importance and feasibility of learning and memorizing
representations of all cross features, which sheds light on a promising new
research direction.
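The three-phase memory procedure the abstract names can be sketched as a single lookup. This is a minimal sketch, not the paper's HCNet: the codebook size, number of hashes, dimensions, and the salted built-in `hash` used for addressing are all stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

codebook_size, num_hashes, code_dim, out_dim = 1000, 2, 8, 4  # illustrative
codebook = rng.normal(size=(codebook_size, code_dim))   # shared memory table
W_shrink = rng.normal(size=(num_hashes * code_dim, out_dim))

def hash_addresses(cross_feature_id, num_hashes, size):
    """Multi-hash addressing: derive several codebook slots from one
    cross-feature id (a simple salted hash stands in for the paper's
    hash functions)."""
    return [hash((cross_feature_id, k)) % size for k in range(num_hashes)]

def hcnet_lookup(cross_feature_id):
    # Memory restoring: fetch and concatenate the addressed codewords.
    slots = hash_addresses(cross_feature_id, num_hashes, codebook_size)
    restored = np.concatenate([codebook[s] for s in slots])
    # Feature shrinking: project down to a compact representation.
    return restored @ W_shrink

v = hcnet_lookup("user_42 x item_7")   # hypothetical cross-feature key
print(v.shape)                          # (4,)
```

Because every cross feature maps into the same fixed-size codebook, memory cost is decoupled from the (combinatorially large) number of cross features, which is what makes memorizing all of them tractable.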
Jacobi pseudo-spectral Galerkin method for second kind Volterra integro-differential equations with a weakly singular kernel
The Jacobi pseudo-spectral Galerkin method for Volterra integro-differential equations of the second kind with a weakly singular kernel is proposed in this paper. We provide a rigorous error analysis for the proposed method, which indicates that the numerical errors (in the $L^2_{\omega^{\alpha,\beta}}$-norm and the $L^\infty$-norm) decay exponentially provided that the source function is sufficiently smooth. Numerical examples are given to illustrate the theoretical results.
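The abstract does not reproduce the problem class; as a point of reference, a typical second-kind Volterra integro-differential equation with a weakly singular kernel takes the form (the notation here is a common convention, not necessarily the paper's):

```latex
y'(t) = a(t)\, y(t) + g(t) + \int_0^t (t-s)^{-\mu}\, K(t,s)\, y(s)\, \mathrm{d}s,
\qquad 0 < \mu < 1, \quad t \in [0, T], \quad y(0) = y_0,
```

where the factor $(t-s)^{-\mu}$ is the weakly singular part of the kernel; its unbounded (but integrable) singularity at $s = t$ is what limits solution regularity and motivates the weighted $L^2_{\omega^{\alpha,\beta}}$ error analysis.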
FiBiNet++: Reducing Model Size by Low Rank Feature Interaction Layer for CTR Prediction
Click-Through Rate (CTR) estimation has become one of the most fundamental
tasks in many real-world applications, and various deep models have been
proposed. Prior research has shown that FiBiNet is one of the best-performing
models and outperforms all other models on the Avazu dataset. However, the
large model size of FiBiNet hinders its wider application. In this paper, we
propose a novel FiBiNet++ model that redesigns FiBiNet's structure, greatly
reducing model size while further improving performance. One of the primary
techniques is our proposed "Low Rank Layer" for feature interaction, which
serves as the crucial driver of the superior compression ratio. Extensive
experiments on three public datasets show that FiBiNet++ reduces the
non-embedding parameters of FiBiNet by 12x to 16x. At the same time, FiBiNet++
delivers significant performance improvements over state-of-the-art CTR
methods, including FiBiNet.
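The compression idea behind a low-rank feature interaction layer is a standard matrix factorization. The following is a minimal NumPy sketch of the parameter-count argument only, not FiBiNet++'s actual layer; the width and rank are illustrative choices, and the 12x-16x figure quoted above comes from the paper's own settings.

```python
import numpy as np

rng = np.random.default_rng(2)

d, rank = 256, 16                   # interaction width and low rank (illustrative)

# Full-rank interaction layer: one dense d x d matrix.
W_full = rng.normal(size=(d, d))

# Low-rank replacement: W is approximated as U @ V,
# with U: (d, rank) and V: (rank, d).
U = rng.normal(size=(d, rank))
V = rng.normal(size=(rank, d))

x = rng.normal(size=(d,))
y_full = x @ W_full                 # uses d*d parameters
y_low = (x @ U) @ V                 # uses 2*d*rank parameters

full_params = d * d                 # 65536
low_params = 2 * d * rank           # 8192
print(full_params // low_params)    # 8x fewer parameters at this setting
```

The compression ratio scales as d / (2 * rank), so the smaller the rank relative to the layer width, the larger the saving; the reported AUC improvements indicate the rank can be kept small without hurting accuracy.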